A Lipschitz Method for Accelerated Volume Rendering

Authors

  • Barton T. Stander
  • John C. Hart
Abstract

Interpolating discrete volume data into a continuous form adapts implicit surface techniques for rendering volumetric iso-surfaces. One such algorithm uses the Lipschitz condition to create an octree representation that accelerates volume rendering. Furthermore, only one preprocessing step is needed to create the Lipschitz-octree representation that accelerates rendering of iso-surfaces for any threshold value.
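
The abstract's key mechanism can be sketched concretely: if the interpolated field f has Lipschitz bound λ (so |f(x) − f(y)| ≤ λ|x − y|), then a ray can safely advance by |f(x) − T|/λ without crossing the iso-surface f = T. The Python below is a minimal illustration under that assumption only; the function names, the analytic test field, and the marching loop are hypothetical, and the paper's Lipschitz-octree representation (which lets the renderer skip empty regions for any threshold) is not reproduced here.

import numpy as np

def lipschitz_ray_march(f, lam, origin, direction, threshold,
                        t_max=10.0, eps=1e-4):
    # March along origin + t*direction until f is within eps*lam of the
    # threshold.  Since |f(x) - f(y)| <= lam * |x - y|, the iso-surface
    # f = threshold can be no closer than |f(x) - threshold| / lam, so that
    # distance is always a safe step (sphere tracing of the implicit surface).
    direction = direction / np.linalg.norm(direction)
    t = 0.0
    while t < t_max:
        x = origin + t * direction
        gap = abs(f(x) - threshold)
        if gap < eps * lam:      # close enough to the iso-surface: report a hit
            return t
        t += gap / lam           # largest step guaranteed not to cross the surface
    return None                  # ray left the volume without hitting the surface

# Illustrative test: an analytic distance field with Lipschitz constant 1,
# whose threshold-1.0 iso-surface is the unit sphere.
f = lambda x: np.linalg.norm(x)
hit = lipschitz_ray_march(f, lam=1.0,
                          origin=np.array([0.0, 0.0, -5.0]),
                          direction=np.array([0.0, 0.0, 1.0]),
                          threshold=1.0)
print("hit at t =", hit)   # prints 4.0, the front of the unit sphere

For interpolated volume data, λ would presumably be bounded per octree node during the single preprocessing pass the abstract mentions, so that cells which cannot contain the chosen threshold are skipped entirely.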

Related articles

An Accelerated Hybrid Proximal Extragradient Method for Convex Optimization and Its Implications to Second-Order Methods

This paper presents an accelerated variant of the hybrid proximal extragradient (HPE) method for convex optimization, referred to as the accelerated HPE (A-HPE) framework. Iteration-complexity results are established for the A-HPE framework, as well as a special version of it, where a large stepsize condition is imposed. Two specific implementations of the A-HPE framework are described in the co...


Accelerated Mirror Descent in Continuous and Discrete Time

We study accelerated mirror descent dynamics in continuous and discrete time. Combining the original continuous-time motivation of mirror descent with a recent ODE interpretation of Nesterov’s accelerated method, we propose a family of continuous-time descent dynamics for convex functions with Lipschitz gradients, such that the solution trajectories converge to the optimum at a O(1/t²) rate. We...
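
For reference, the O(1/t²) rate mentioned above is the rate attained by the classical discrete-time Nesterov scheme for convex functions with Lipschitz gradients. The sketch below implements that standard scheme, not the mirror-descent family proposed in the paper; the quadratic test problem and the 1/L step size are illustrative choices.

import numpy as np

def nesterov_agd(grad, L, x0, iters=200):
    # Classical Nesterov accelerated gradient method for a convex f whose
    # gradient is L-Lipschitz: it attains the O(1/k^2) objective-gap rate,
    # versus O(1/k) for plain gradient descent.
    x_prev = x = np.asarray(x0, dtype=float)
    for k in range(1, iters + 1):
        y = x + (k - 1) / (k + 2) * (x - x_prev)   # momentum/extrapolation step
        x_prev, x = x, y - grad(y) / L             # gradient step from the extrapolated point
    return x

# Illustrative test: minimize f(x) = 0.5 * x^T A x, whose minimizer is the
# zero vector; L is the largest eigenvalue of A.
A = np.diag([1.0, 10.0, 100.0])
grad = lambda x: A @ x
x_final = nesterov_agd(grad, L=100.0, x0=np.ones(3))
print(np.linalg.norm(x_final))   # small: the iterates approach the zero vector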


Accelerated gradient sliding for structured convex optimization

Our main goal in this paper is to show that one can skip gradient computations for gradient descent type methods applied to certain structured convex programming (CP) problems. To this end, we first present an accelerated gradient sliding (AGS) method for minimizing the summation of two smooth convex functions with different Lipschitz constants. We show that the AGS method can skip the gradient...


Accelerated Methods for Non-Convex Optimization

We present an accelerated gradient method for non-convex optimization problems with Lipschitz continuous first and second derivatives. The method requires time O(ε^(−7/4) log(1/ε)) to find an ε-stationary point, meaning a point x such that ‖∇f(x)‖ ≤ ε. The method improves upon the O(ε^(−2)) complexity of gradient descent and provides the additional second-order guarantee that ∇²f(x) ⪰ −O(√ε)I for the compu...


Journal:

Volume   Issue

Pages  -

Published: 1994